Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from this site.
- The cellular network has undergone rapid progress since its inception in the 1980s. While rapid iteration of newer generations of cellular technology plays a key role in this evolution, the incremental and eventually wide deployment of each new technology generation also plays a vital role in delivering the promised performance improvement. In this work, we conduct the first metamorphosis study of a cellular network generation, 5G, by measuring user-experienced 5G performance from the 5G network’s birth (initial deployment) to maturity (steady state). By analyzing a 4-year 5G performance trace of 2.65M+ Ookla® Speedtest Intelligence® measurements collected in 9 cities in the United States and Europe from January 2020 to December 2023, we unveil the detailed evolution of 5G coverage, throughput, and latency at quarterly granularity, compare performance diversity across the 9 representative cities, and gain insights into compounding factors that affect user-experienced 5G performance, such as the adoption of 5G devices and the load on the 5G network. Our study uncovers the typical life-cycle of a new cellular technology generation as it undergoes its “growing pains” on the way to delivering its promised QoE improvement over the previous technology generation. (Free, publicly-accessible full text available October 15, 2026.)
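  The quarterly view this abstract describes can be reproduced from any per-test trace with a simple aggregation. The sketch below is illustrative only, not the study's actual pipeline; the column names (city, timestamp, downlink_mbps) are assumed for the example.

  ```python
  # Minimal sketch: collapse per-test Speedtest-style records into the
  # quarterly per-city series analyzed in the study. Column names are
  # assumptions, not the dataset's real schema.
  import pandas as pd

  def quarterly_medians(df: pd.DataFrame) -> pd.DataFrame:
      """Median downlink throughput per city and calendar quarter."""
      out = df.copy()
      out["quarter"] = out["timestamp"].dt.to_period("Q")  # e.g. 2021Q3
      return (out.groupby(["city", "quarter"])["downlink_mbps"]
                 .median()
                 .reset_index())
  ```

  The same groupby pattern extends to latency and coverage metrics by swapping the aggregated column.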
- In 2022, 3 years after the initial 5G rollout, the authors of [28] conducted an in-depth measurement study of user-perceived experience (network coverage, performance, and QoE of a set of major 5G “killer” apps) over all three major US carriers through a cross-country US driving trip (from Los Angeles to Boston). The study revealed disappointingly low 5G coverage and suboptimal network performance – falling short of the expectations needed to support the new generation of 5G “killer” apps. Now, five years into the 5G era, widely considered its midlife, 5G networks are expected to deliver stable and mature performance. In this work, we replicate the 2022 study along the same coast-to-coast route, evaluating the current state of cellular coverage and network and application performance across all three major US operators. While we observe a substantial increase in 5G coverage and a corresponding boost in network performance, two out of three operators still exhibit less than 50% 5G coverage along the driving route even five years after the initial 5G rollout. We expand the scope of the previous work by analyzing key lower-layer KPIs that directly influence network performance. Finally, we introduce a head-to-head comparison with Starlink’s LEO satellite network to assess whether emerging non-terrestrial networks (NTNs) can complement terrestrial cellular infrastructure in the next generation of wireless connectivity. (Free, publicly-accessible full text available July 28, 2026.)
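  A route-coverage figure like the "less than 50%" statistic above is, at its simplest, the share of drive-test samples served by 5G. The helper below is a hedged sketch under the assumption that the trace is a sequence of per-sample radio-technology labels; the label format is invented for illustration.

  ```python
  # Sketch: fraction of drive-test samples on which the serving radio
  # technology was 5G. Label strings like "5G-NSA" are assumptions.
  def coverage_fraction(tech_labels: list[str], target: str = "5G") -> float:
      """Share of samples whose technology label starts with `target`."""
      if not tech_labels:
          return 0.0
      return sum(1 for t in tech_labels if t.startswith(target)) / len(tech_labels)

  # Example: two of four samples are on 5G -> 0.5
  assert coverage_fraction(["LTE", "5G-NSA", "5G-SA", "LTE"]) == 0.5
  ```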
- Free, publicly-accessible full text available February 26, 2026.
- Networking research has witnessed a renaissance driven by the seemingly unlimited predictive power of machine learning (ML) models. One such promising direction is throughput prediction: accurately predicting the network bandwidth or achievable throughput of a client in real time using ML models can enable a wide variety of network applications to proactively adapt their behavior to changing network dynamics and potentially achieve significantly improved QoE. Motivated by the key role of newer generations of cellular networks in supporting the new generation of latency-critical applications such as AR/MR, in this work we focus on accurate throughput prediction in cellular networks at fine time-scales, e.g., on the order of 100 ms. Through a 4-day, 1000+ km driving trip, we collect a dataset of fine-grained throughput measurements while driving across all three major US operators. Using the collected dataset, we conduct the first feasibility study of predicting fine-grained application throughput in real-world cellular networks with mixed LTE/5G technologies. Our analysis shows that popular ML models previously claimed to predict well for various wireless network scenarios (e.g., WiFi, or single-technology networks such as LTE only) do not predict well under app-centric metrics such as ARE95 and PARE10. Further, we uncover the root cause of the poor prediction accuracy of these ML models: inherent conflicting sample sequences in the fine-grained cellular network throughput data.
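  For concreteness, here is a sketch of the two app-centric metrics named above, under the common reading that ARE95 is the 95th percentile of absolute relative error and PARE10 is the fraction of predictions whose absolute relative error exceeds 10%; consult the paper for the authors' exact definitions.

  ```python
  # Hedged sketch of ARE95 / PARE10 over arrays of predicted vs. measured
  # throughput. The exact definitions are assumed, not taken from the paper.
  import numpy as np

  def abs_rel_error(pred, actual) -> np.ndarray:
      pred, actual = np.asarray(pred, float), np.asarray(actual, float)
      return np.abs(pred - actual) / np.maximum(actual, 1e-9)  # guard /0

  def are95(pred, actual) -> float:
      """95th percentile of absolute relative error."""
      return float(np.percentile(abs_rel_error(pred, actual), 95))

  def pare10(pred, actual) -> float:
      """Fraction of samples with absolute relative error above 10%."""
      return float(np.mean(abs_rel_error(pred, actual) > 0.10))
  ```

  Unlike mean-error metrics, both emphasize tail behavior, which is what latency-critical apps actually experience when a prediction misses badly.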
- Free, publicly-accessible full text available November 1, 2026.
- After rapid deployment worldwide over the past few years, 5G is expected to have reached a mature deployment stage that provides measurable improvement in network performance and user experience over its predecessors. In this study, we assess 5G deployment maturity via three conditions: (1) Does 5G performance remain stable over a long time span? (2) Does 5G provide better performance than its predecessor, LTE? (3) Does the technology offer similar performance across diverse geographic areas and cellular operators? We answer these questions by conducting a cross-sectional, year-long measurement study of 5G uplink performance. Leveraging a custom Android app, we collected 5G uplink performance measurements (of critical importance to latency-critical apps) spanning 8 major cities in 7 countries across two continents. Our measurements show that 5G deployment in major cities appears to have matured, with no major performance improvements observed over a one-year period, yet 5G does not provide consistently superior measurable performance over LTE, especially in terms of latency, and 5G performance remains clearly uneven across the 8 cities. Our study suggests that, while 5G deployment appears to have stagnated, it falls short of delivering its promised performance and user-experience gains over its predecessor.
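  The first maturity condition (stability over a long span) can be operationalized in many ways; one minimal sketch, assuming a series of quarterly medians and an arbitrary 10% tolerance neither of which comes from the paper, is:

  ```python
  # Sketch of a stability test: performance is "stable" if every quarterly
  # median stays within +/- tol of the yearly median. Threshold is assumed.
  import statistics

  def is_stable(quarterly_medians: list[float], tol: float = 0.10) -> bool:
      base = statistics.median(quarterly_medians)
      if base == 0:
          return False  # degenerate series; cannot assess relative change
      return all(abs(m - base) / base <= tol for m in quarterly_medians)

  # e.g. is_stable([48.0, 51.5, 50.2, 49.1]) -> True
  ```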
- One of the key enhancements in the upcoming 802.11ay standard for 60 GHz WLANs is support for simultaneous transmission of up to 8 data streams via SU- and MU-MIMO, which has the potential to enable data rates up to 100 Gbps. However, in spite of the key role MIMO is expected to play in 802.11ay, experimental evaluation of MIMO performance in 60 GHz WLANs has been limited to date, primarily due to the lack of hardware supporting MIMO transmissions at millimeter-wave frequencies. In this work, we fill this gap by conducting the first large-scale experimental evaluation of SU- and MU-MIMO performance in 60 GHz WLANs. Unlike previous studies, our study involves multiple environments with very different multipath characteristics. We analyze the performance in each environment, identify the factors that affect it, and compare it against the performance of SISO. Further, we seek to identify factors that can guide beam and user selection to limit the (often prohibitive in practice) overhead of exhaustive search. Finally, we propose two heuristics that perform both user and beam selection with low overhead, and show that they perform close to an oracle solution and outperform previously proposed approaches in both static and mobile scenarios, regardless of the environment and number of users.
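  To illustrate why low-overhead selection matters, the sketch below shows a generic greedy baseline for joint user/beam selection. It is not the paper's heuristics: the greedy rule and the assumed rate estimator `est_rate` (which scores a candidate set, e.g. from per-beam SNR probes) are invented for the example.

  ```python
  # Generic greedy user/beam selection: add the (user, beam) pair with the
  # largest marginal sum-rate gain until 8 streams or no gain remains.
  # This avoids the exponential cost of exhaustive search.
  from typing import Callable, Hashable

  def greedy_select(candidates: list[tuple[Hashable, Hashable]],  # (user, beam)
                    est_rate: Callable[[list], float],
                    max_streams: int = 8) -> list:
      chosen: list = []
      used_users = set()
      while len(chosen) < max_streams:
          best, best_rate = None, est_rate(chosen)
          for cand in candidates:
              user, _ = cand
              if user in used_users:   # one stream per user in this sketch
                  continue
              r = est_rate(chosen + [cand])
              if r > best_rate:
                  best, best_rate = cand, r
          if best is None:             # no pair improves the sum rate
              break
          chosen.append(best)
          used_users.add(best[0])
      return chosen
  ```

  Greedy selection evaluates O(users × beams × streams) candidate sets instead of every subset, which is the kind of overhead reduction the proposed heuristics target.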